4 results found. Listed under two name variants: Pyeong Whan Cho (3) and Pyeong W. Cho (1).
  1. Fractal Analysis Illuminates the Form of Connectionist Structural Gradualness. Whitney Tabor, Pyeong Whan Cho & Emily Szkudlarek - 2013 - Topics in Cognitive Science 5 (3): 634-667.
    We examine two connectionist networks—a fractal learning neural network (FLNN) and a Simple Recurrent Network (SRN)—that are trained to process center-embedded symbol sequences. Previous work provides evidence that connectionist networks trained on infinite-state languages tend to form fractal encodings. Most such work focuses on simple counting recursion cases (e.g., aⁿbⁿ), which are not comparable to the complex recursive patterns seen in natural language syntax. Here, we consider exponential state growth cases (including mirror recursion), describe a new training scheme that seems (...)
    (A short sketch contrasting counting recursion with mirror recursion follows this list.)
    3 citations.
  2. PIPS: A Parallel Planning Model of Sentence Production. Laurel Brehm, Pyeong Whan Cho, Paul Smolensky & Matthew A. Goldrick - 2022 - Cognitive Science 46 (2): e13079.
  3. Discovery of a Recursive Principle: An Artificial Grammar Investigation of Human Learning of a Counting Recursion Language. Pyeong Whan Cho, Emily Szkudlarek & Whitney Tabor - 2016 - Frontiers in Psychology 7.
  4. Birth of an Abstraction: A Dynamical Systems Account of the Discovery of an Elsewhere Principle in a Category Learning Task. Whitney Tabor, Pyeong W. Cho & Harry Dankowicz - 2013 - Cognitive Science 37 (7): 1193-1227.
    Human participants and recurrent ("connectionist") neural networks were both trained on a categorization system abstractly similar to natural language systems involving irregular ("strong") classes and a default class. Both the humans and the networks exhibited staged learning and a generalization pattern reminiscent of the Elsewhere Condition (Kiparsky, 1973). Previous connectionist accounts of related phenomena have often been vague about the nature of the networks' encoding systems. We analyzed our network using dynamical systems theory, revealing topological and geometric properties that can (...)
    (A sketch of the Elsewhere Condition's specific-over-default rule interaction follows this list.)
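
To make the distinction in entry 1 concrete, here is a minimal Python sketch (an illustration only, not code from the paper) of the two sequence types: counting recursion strings aⁿbⁿ, where a single counter suffices to check well-formedness, and mirror-recursion strings (w followed by its reversal), where the processor must remember all of w and the number of possible states grows exponentially with its length.

    # Minimal sketch (illustration only, not code from the paper) contrasting
    # the two sequence types: counting recursion vs. mirror recursion.
    import random

    def counting_recursion(n):
        # a^n b^n: one counter suffices to check well-formedness (n=3 -> "aaabbb").
        return "a" * n + "b" * n

    def mirror_recursion(n, alphabet="ab"):
        # w followed by its reversal (e.g., "abb" -> "abbbba"). The number of
        # possible w's grows as len(alphabet)**n: the "exponential state
        # growth" the abstract refers to.
        w = "".join(random.choice(alphabet) for _ in range(n))
        return w + w[::-1]

    print(counting_recursion(3))  # aaabbb
    print(mirror_recursion(3))    # e.g., abbbba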
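The Elsewhere Condition discussed in entry 4 says that a more specific rule preempts a general default. A minimal Python sketch of that rule interaction, using the standard English past-tense example (an illustration only; the paper's task used an abstract category system, not English verbs):

    # Minimal sketch of the Elsewhere Condition: a specific rule preempts
    # the general default (illustration only, not the paper's model).
    IRREGULAR_PAST = {"sing": "sang", "go": "went", "bring": "brought"}  # "strong" class

    def past_tense(verb):
        if verb in IRREGULAR_PAST:   # specific (irregular) rule applies first
            return IRREGULAR_PAST[verb]
        return verb + "ed"           # default rule applies "elsewhere"

    print(past_tense("sing"))  # sang   (irregular class wins)
    print(past_tense("walk"))  # walked (default applies elsewhere)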